A deep learning approach based on stochastic gradient descent and least absolute shrinkage and selection operator for identifying diabetic retinopathy

Authors

Abstract

More than eighty-five to ninety percent of diabetic patients are affected by diabetic retinopathy (DR), an eye disorder that leads to blindness. Computational techniques can support the detection of DR using retinal images; however, raw images are hard to analyze directly. This paper proposes an effective method for identifying DR from retinal fundus images. In this research work, a Wiener filter is first used for preprocessing. The preprocessed image is then segmented with the fuzzy c-means technique, and features are extracted from the segmented image using the grey level co-occurrence matrix (GLCM). After extracting the fundus features, feature selection is performed with stochastic gradient descent and the least absolute shrinkage and selection operator (LASSO) to support accurate classification. An inception v3-convolutional neural network (IV3-CNN) model is used in the classification process to label each image as DR or non-DR. By applying the proposed method, the performance of the IV3-CNN in identifying DR is studied. The proposed method identifies DR with an accuracy of about 95% and detects mild DR in the processed images.


Similar resources

A new Levenberg-Marquardt approach based on Conjugate gradient structure for solving absolute value equations

In this paper, we present a new approach for solving the absolute value equation (AVE) which uses the Levenberg-Marquardt method with a conjugate subgradient structure. In conjugate subgradient methods, the new direction is obtained by combining the steepest descent direction and the previous direction, which may not lead to good numerical results. Therefore, we replace the steepest descent dir...


Distributed Deep Learning Using Synchronous Stochastic Gradient Descent

We design and implement a distributed multinode synchronous SGD algorithm, without altering hyperparameters, or compressing data, or altering algorithmic behavior. We perform a detailed analysis of scaling, and identify optimal design points for different networks. We demonstrate scaling of CNNs on 100s of nodes, and present what we believe to be record training throughputs. A 512 minibatch VGG...
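The synchronous scheme sketched in that abstract can be illustrated in a few lines: each worker computes the gradient on its data shard, the gradients are averaged, and one shared update is applied. The code below is a toy single-process simulation (worker count, learning rate, and data are illustrative, not from the paper):

```python
# Minimal simulation of synchronous data-parallel SGD on least squares.
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((64, 5))
true_w = rng.standard_normal(5)
y = X @ true_w

def grad(w, Xb, yb):
    """Mean-squared-error gradient on one shard."""
    return 2.0 * Xb.T @ (Xb @ w - yb) / len(yb)

w = np.zeros(5)
n_workers, lr = 4, 0.1
for step in range(200):
    shards = zip(np.array_split(X, n_workers), np.array_split(y, n_workers))
    # Average the per-worker gradients, then apply one shared update.
    g = np.mean([grad(w, Xb, yb) for Xb, yb in shards], axis=0)
    w -= lr * g
print("final error:", np.linalg.norm(w - true_w))
```

With equal shard sizes, the averaged gradient equals the full-batch gradient, so synchronous SGD matches single-node SGD step for step; this is why the abstract can claim unchanged hyperparameters and algorithmic behavior.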


Online Learning, Stability, and Stochastic Gradient Descent

In batch learning, stability together with existence and uniqueness of the solution corresponds to well-posedness of Empirical Risk Minimization (ERM) methods; recently, it was proved that CVloo stability is necessary and sufficient for generalization and consistency of ERM ([9]). In this note, we introduce CVon stability, which plays a similar role in online learning. We show that stochastic g...


Annealed Gradient Descent for Deep Learning

Stochastic gradient descent (SGD) has been regarded as a successful optimization algorithm in machine learning. In this paper, we propose a novel annealed gradient descent (AGD) method for non-convex optimization in deep learning. AGD optimizes a sequence of gradually improved smoother mosaic functions that approximate the original non-convex objective function according to an annealing schedul...


Comparison between the stochastic search variable selection and the least absolute shrinkage and selection operator for genome-wide association studies of rheumatoid arthritis

BACKGROUND Because multiple loci control complex diseases, there is great interest in testing markers simultaneously instead of one by one. In this paper, we applied two model selection algorithms: the stochastic search variable selection (SSVS) and the least absolute shrinkage and selection operator (LASSO) to two quantitative phenotypes related to rheumatoid arthritis (RA). RESULTS The Gene...
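The LASSO side of that comparison, testing many markers simultaneously, can be sketched on synthetic genotype data. Everything below (SNP counts, effect sizes, the alpha value) is illustrative and not taken from the study:

```python
# Hedged sketch of LASSO-based marker selection on synthetic genotypes.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
n_ind, n_snps = 400, 50
# Genotypes coded 0/1/2 (minor-allele counts) for 400 individuals.
G = rng.integers(0, 3, size=(n_ind, n_snps)).astype(float)
# Quantitative phenotype driven by two causal SNPs plus noise.
pheno = 1.0 * G[:, 0] - 0.8 * G[:, 3] + 0.2 * rng.standard_normal(n_ind)

# L1 regularization fits all markers jointly and zeroes most of them.
model = Lasso(alpha=0.05).fit(G, pheno)
selected = np.flatnonzero(model.coef_)
print("selected SNPs:", selected)
```

Unlike single-marker tests, the joint L1 fit considers all loci at once, which is the motivation stated in the abstract.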



Journal

Journal title: Indonesian Journal of Electrical Engineering and Computer Science

Year: 2022

ISSN: 2502-4752, 2502-4760

DOI: https://doi.org/10.11591/ijeecs.v25.i1.pp589-600